
Improving the Knowledge Gradient Algorithm

Neural Information Processing Systems

The knowledge gradient (KG) algorithm is a popular policy for the best arm identification (BAI) problem. It is built on the simple idea of always choosing the measurement that yields the greatest expected one-step improvement in the estimate of the best mean of the arms.
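The one-step-improvement idea can be sketched for the standard setting of independent normal beliefs over the arm means with known Gaussian observation noise. This is a generic illustration of the classic KG policy, not the paper's improved algorithm; the names `kg_factor`, `kg_choose`, and `bayes_update` are made up for this sketch:

```python
import math

def kg_factor(z):
    """f(z) = z*Phi(z) + phi(z), the standard normal KG function."""
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z * Phi + phi

def kg_choose(mu, sigma2, noise_var):
    """Pick the arm whose measurement gives the largest expected
    one-step improvement in the estimate of the best mean."""
    vals = []
    for i in range(len(mu)):
        best_other = max(mu[j] for j in range(len(mu)) if j != i)
        # std. dev. of the predictive change in arm i's posterior mean
        s = sigma2[i] / math.sqrt(sigma2[i] + noise_var)
        z = -abs(mu[i] - best_other) / s
        vals.append(s * kg_factor(z))
    return max(range(len(mu)), key=lambda i: vals[i])

def bayes_update(mu_i, sigma2_i, y, noise_var):
    """Conjugate normal update after observing reward y from arm i."""
    prec = 1 / sigma2_i + 1 / noise_var
    new_var = 1 / prec
    new_mu = new_var * (mu_i / sigma2_i + y / noise_var)
    return new_mu, new_var
```

With equal posterior means, KG prefers the more uncertain arm, since measuring it is more informative about which arm is best.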


Non-asymptotic Convergence of Training Transformers for Next-token Prediction

Neural Information Processing Systems

Theoretical understanding of next-token prediction (NTP) remains limited, with existing studies focusing mainly on asymptotic performance. This paper provides a fine-grained non-asymptotic analysis of the training dynamics of a one-layer transformer consisting of a self-attention module followed by a feed-forward layer.
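As a rough illustration of the architecture studied (the forward pass only, not the paper's training analysis), a minimal NumPy sketch of a one-layer transformer — causal self-attention followed by a ReLU feed-forward layer — computing the NTP cross-entropy might look like this; all shapes and names are assumptions of the sketch:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def one_layer_transformer_ntp_loss(tokens, E, Wq, Wk, Wv, W1, W2, Wout):
    """Next-token-prediction cross-entropy of a one-layer transformer."""
    X = E[tokens]                                      # (T, d) embeddings
    T = len(tokens)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(X.shape[1])
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf  # causal mask
    A = softmax(scores, axis=-1) @ V                   # self-attention output
    H = np.maximum(A @ W1, 0) @ W2                     # ReLU feed-forward layer
    probs = softmax(H @ Wout, axis=-1)                 # (T, vocab)
    # position t predicts token t+1
    nll = -np.log(probs[np.arange(T - 1), tokens[1:]])
    return nll.mean()
```

With the output projection zeroed, the predictive distribution is uniform over the vocabulary, so the loss equals log of the vocabulary size — a convenient sanity check.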


Appendix for "Fine-Grained Theoretical Analysis of Federated Zeroth-Order Optimization"

Neural Information Processing Systems

The main notations of this paper are summarized in Table 1; for example, N and n denote the total number of clients and the number of samples per client, and e denotes the base of the natural logarithm. We first introduce the lemmas that will be used in our proofs, then prove the stated result in Part (b) and derive the optimization bound.
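For context on what zeroth-order (derivative-free) optimization in a federated setting involves, here is a sketch of the standard two-point gradient estimator combined with a FedAvg-style averaging step. This is a generic illustration, not the paper's algorithm; `zo_gradient`, `fed_zo_step`, and their parameters are assumptions of the sketch:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate of f at x along a
    random Gaussian direction u: (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u."""
    rng = rng or np.random.default_rng(0)
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def fed_zo_step(client_fns, x, lr=0.05, mu=1e-4, rng=None):
    """One server round: each client forms a ZO gradient of its local
    objective, and the server averages them and takes a gradient step."""
    rng = rng or np.random.default_rng(0)
    grads = [zo_gradient(f, x, mu, rng) for f in client_fns]
    return x - lr * np.mean(grads, axis=0)
```

On a shared quadratic objective, iterating this step drives the objective toward zero, since the two-point estimate is an unbiased estimate of the true gradient up to O(mu) smoothing error.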




Deep Bootstrap

Chang, Jinyuan; Jiao, Yuling; Kang, Lican; Shi, Junjie

arXiv.org Machine Learning

As a result, the demand for interval estimation, and consequently for its validity and precision, has increased steadily over time, as reflected in a number of recent studies. For example, in proteomics, confidence intervals are employed to assess the association between post-translational modifications and intrinsically disordered regions of proteins, validating hypotheses derived from predictive models and facilitating large-scale functional analyses (Tunyasuvunakool et al., 2021; Bludau et al., 2022). In genomic research, confidence intervals are leveraged to characterize the distribution of gene expression levels, enabling robust inferences about promoter sequence effects and genetic variability (Vaishnav et al., 2022). In environmental science, interval estimation can be used to monitor deforestation rates, yielding uncertainty-aware insights critical for climate policy formulation (Bullock et al., 2020). In the social sciences, confidence intervals are utilized to evaluate relationships between socioeconomic factors, bolstering the robustness of conclusions drawn from census data (Ding et al., 2021).
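As background on what interval estimation involves in these applications, a classical percentile-bootstrap confidence interval (the textbook baseline, not the paper's deep-bootstrap procedure; the function name and defaults are illustrative) can be sketched as:

```python
import numpy as np

def bootstrap_percentile_ci(data, stat, level=0.95, B=2000, seed=0):
    """Percentile bootstrap confidence interval for stat(data):
    resample the data with replacement B times, recompute the statistic,
    and take the central quantiles of the resampled values."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(B)])
    alpha = 1 - level
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

For the sample mean, the resampled statistics concentrate around the observed mean, so the interval brackets it and shrinks roughly like 1/sqrt(n).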



A Proofs from Section 2

Neural Information Processing Systems

We show a generalization of Proposition 2.1 and establish the sample complexity of Algorithm 4, which returns the estimate α̂. The sample complexity claim is clear, so we focus on the first statement: applying a Chernoff bound (Theorem 4.5 in [MU17]) to these events as i varies and recalling (A.2) gives one direction; the other direction follows similarly from (A.2). We then analyze the expected sample complexity of Algorithm 4 using Bayes' rule.